A New Descent Nonlinear Conjugate Gradient Method for Unconstrained Optimization
Authors
Abstract
In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line search. We use a steplength technique that ensures the Zoutendijk condition holds, and the method is proved to be globally convergent. Finally, we improve the method and carry out further analysis.
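For reference, the two properties invoked in this abstract are usually stated as follows; this is the standard textbook form, with $g_k = \nabla f(x_k)$ denoting the gradient and $d_k$ the search direction at iteration $k$ (the paper's particular update formula and constants are not reproduced here). The sufficient descent condition and the Zoutendijk condition read

$$
g_k^\top d_k \;\le\; -c\,\|g_k\|^2 \quad \text{for some fixed } c > 0 \text{ and all } k,
\qquad
\sum_{k \ge 0} \frac{(g_k^\top d_k)^2}{\|d_k\|^2} \;<\; \infty .
$$

Under standard assumptions on $f$ and the steplengths, these two conditions are the usual ingredients for proving $\liminf_{k\to\infty} \|g_k\| = 0$, which is the sense of global convergence referred to above.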
Related resources
A New Sufficient Descent Conjugate Gradient Method for Unconstrained Optimization
In this paper, a new conjugate gradient method with the sufficient descent property is proposed for the unconstrained optimization problem. An attractive property of the new method is that the direction it generates always possesses the sufficient descent property, independently of the line search used and of the choice of the conjugacy parameter. Under mild conditions, the global c...
A new hybrid conjugate gradient algorithm for unconstrained optimization
In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. The new method generates sufficient descent directions independently of any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems
In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that, based on eigenvalue analysis, the search direction always satisfies the sufficient descent condition independently of the line search method. The globa...
Modified nonlinear conjugate gradient method with sufficient descent condition for unconstrained optimization
In this paper, an efficient modified nonlinear conjugate gradient method for solving unconstrained optimization problems is proposed. An attractive property of the modified method is that the direction generated at each step is always a descent direction, without any line search. The global convergence of the modified method is established under the general Wolfe line search condition. Numerical r...
Journal
Journal title: Applied Mathematics
Year: 2011
ISSN: 2152-7385, 2152-7393
DOI: 10.4236/am.2011.29154